Received: from io.org by netcom2.netcom.com (8.6.12/Netcom)
id FAA10234; Wed, 19 Apr 1995 05:08:04 -0700
Received: from ajp-pc.net5c.io.org (ajp-pc.net5c.io.org [199.166.192.211]) by io.org (8.6.9/8.6.9) with SMTP id IAA22770 for <lightwave-l@netcom.com>; Wed, 19 Apr 1995 08:08:13 -0400
Date: Wed, 19 Apr 1995 08:08:13 -0400
Message-Id: <199504191208.IAA22770@io.org>
X-Sender: ajp@io.org
X-Mailer: Windows Eudora Version 2.0.3
Mime-Version: 1.0
Content-Type: text/plain; charset="us-ascii"
To: lightwave-l@netcom.com
From: ajp@io.org (Anthony James Paterson)
Subject: Re: Need suggestions....
Sender: owner-lightwave-l@netcom.com
Precedence: bulk
>
> How can I superimpose two images without a keyer?!? I only have a
>PAR and a TBC IV hooked up.. I want to take live action and add lightwave
>effects to it and vice versa ( have a lightwave rendered environment,and
>have real people walk through it)
> I know it can be done, I just don't know how.... Can anyone walk
>me through such a process?!?!? (
> I Also have ADPro, Image FX Dpaint Lightwave 3.5 and a few other
>graphic programs if I need to use any of them...
>
Hi Adrian!
We used a few different techniques to do this type of work on Robocop,
depending upon what the original footage was, how it was shot, and what the
desired effect was.
1. If you are using just an Amiga, you can use Adpro and Dpaint or Brilliance to
create a moving matte of your "live" footage. First you export the captured
video from the PAR & TBC IV, then use Adpro to create a 4- or 16-colour
greyscale anim that uses only 3 or 15 of those colours respectively. Then load
the anim into Dpaint or Brilliance and use the remaining unused colour to matte
out the people or other information you want to keep over the length of the
animation.
The nice part about this technique is that by making the matte an amiga anim,
you get near real-time feedback of how your matte will look over the "live"
video and you get to use all the nice paint tools that these programs provide.
When you're happy with how the matte looks, you just adjust the palette so that
the original "live" colours are black, and the matte colour is white. The
resulting anim will then contain just white matte information over black, and
can be used with Adpro or Lightwave as a transparency map for compositing the
original source frames. It also helps to run the anim through an Adpro blur
with a small value to effectively anti-alias the edges of the matte.
We used this technique to matte out portions of live video in order to
composite in 3D objects that appeared behind live objects, like the baby
carriage mentioned in the VTU article about the show. We also used variations
of this technique to track objects for Robovision targeting sequences and to
paint "organic"-looking effects, like arcs and sparks of electricity, into
live scenes.
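To make the matte-and-composite steps above concrete, here is a rough sketch in
modern Python/NumPy of what the white-over-black matte amounts to numerically.
None of this is from the original Amiga tools; the function names, the box blur
standing in for Adpro's small-value blur, and the sample pixel values are all
illustrative assumptions.

```python
import numpy as np

def box_blur(matte, radius=1):
    """Cheap box blur to soften (anti-alias) the matte edges,
    standing in for a small-value Adpro blur."""
    k = 2 * radius + 1
    padded = np.pad(matte, radius, mode="edge")
    out = np.zeros_like(matte)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + matte.shape[0], dx:dx + matte.shape[1]]
    return out / (k * k)

def composite(foreground, background, matte):
    """Use the white-over-black matte as a transparency map:
    white (1.0) keeps the foreground, black (0.0) shows the background."""
    a = matte[..., None]          # broadcast the matte over the RGB channels
    return foreground * a + background * (1.0 - a)

# Tiny 4x4 example: left half matted in, right half matted out.
fg = np.full((4, 4, 3), 0.9)      # "live" footage frame
bg = np.full((4, 4, 3), 0.1)      # LightWave-rendered frame
matte = np.zeros((4, 4))
matte[:, :2] = 1.0                # hard-edged painted matte
soft = box_blur(matte)            # blurred edge = anti-aliased matte
frame = composite(fg, bg, soft)
```

Repeating this per frame of the anim gives the composited sequence; the blurred
matte values between 0 and 1 are what smooth the edge between live and rendered
pixels.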
2. Another easy way to do the above, if you have access to other platforms, is
to use the cookie-cut feature in Elastic Reality for Windows, Mac, or SGI to
cut the live foreground elements out of the background. The advantage of this
method is that it saves a number of steps, and tracking the matte consists of
just moving a shape over a series of frames rather than actually painting each
matte frame. The disadvantage is that you don't get quite as much real-time
feedback as when working with an anim; you have to render off QuickTime or
wireframe previews to see how it looks "live".
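The shape-tracking idea behind that cookie-cut approach can be sketched in
plain Python: keyframe a polygon around the foreground element, interpolate its
vertices for the in-between frames, and rasterise each interpolated shape into
a white-on-black matte. This is not Elastic Reality's actual algorithm, just an
illustration of the principle; all names and coordinates here are made up.

```python
def lerp_shape(shape_a, shape_b, t):
    """Linearly interpolate matching polygon vertices between two keyframes."""
    return [(xa + (xb - xa) * t, ya + (yb - ya) * t)
            for (xa, ya), (xb, yb) in zip(shape_a, shape_b)]

def point_in_polygon(x, y, poly):
    """Even-odd ray-casting test for whether a point falls inside the shape."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def rasterise_matte(poly, width, height):
    """Turn the interpolated shape into a white (1) on black (0) matte frame."""
    return [[1 if point_in_polygon(x + 0.5, y + 0.5, poly) else 0
             for x in range(width)] for y in range(height)]

# Two keyframes: a square cutout that drifts right over the shot.
key0 = [(1, 1), (4, 1), (4, 4), (1, 4)]
key1 = [(3, 1), (6, 1), (6, 4), (3, 4)]
frames = [rasterise_matte(lerp_shape(key0, key1, t / 4), 8, 6)
          for t in range(5)]
```

Only the keyframes are hand-placed; the three in-between mattes come for free,
which is exactly the step-saving over painting every matte frame by hand.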
Obviously, either method is painful compared to using a quality chromakeyer
like an Ultimatte, but as few people have access to expensive equipment like
this, these methods will let you get the job done, with a good deal of